So are we ready for a kickoff?
Yes we are.
We are now continuing the SVD topic and then we are going to start with a pre-processing chapter.
So yesterday we ended up with one problem that is very interesting.
We looked at the optimization problem where we have a matrix mapping A and a vector x that we are looking for.
The vector x should have unit length in terms of the L2 norm, and we look for the x that minimizes the length of Ax.
In the ideal case, if we have for instance a non-trivial null space, there exists a non-zero vector x that is mapped by A to the zero vector.
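Written out, the problem just described is (the notation is added here, it is not spelled out in the recording):

\[
\hat{x} = \arg\min_{\|x\|_2 = 1} \|A x\|_2 ,
\]

and in the noise-free case there is a non-zero x with A x = 0, i.e. x lies in the null space of A.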
So for instance, if you have to solve the problem Ax = 0, then computing the null space is basically what you have to do, right?
If I give you the problem Ax = 0, you compute the null space of the matrix to solve it exactly.
So you run the methods that you learned in your first semester, and you will end up with a vector x, if there is a non-trivial null space, or with a set of vectors that span the null space.
Unfortunately, we are here in the field of image processing; unfortunately from your point of view.
If you compute A from signals, A will be corrupted by noise, and that means there will for sure be no non-trivial null space, and the question is: how do you compute it then?
And usually in engineering we say we try to get as close as possible to zero.
And getting as close as possible to zero more or less means that we look for an x that minimizes the length of the resulting vector Ax.
If A is not corrupted, we get exactly zero; if A is somehow corrupted, we get something that is close to zero.
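A small numerical sketch of this point (my own example, the matrix and the noise level are not from the lecture):

    A_clean = [1 2; 2 4];                 % rank 1, so it has a non-trivial null space
    x0 = [2; -1] / sqrt(5);               % unit vector with A_clean * x0 = 0
    A_noisy = A_clean + 1e-3 * randn(2);  % the kind of matrix we actually estimate from noisy signals
    norm(A_clean * x0)                    % zero up to machine precision
    norm(A_noisy * x0)                    % small, on the order of the noise, but no longer zero

For the clean matrix the minimum of the objective is zero; for the noisy matrix the best we can do is get close to zero.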
If you don't get the core idea here, don't worry; we will see tons of examples where we use this technique to solve this type of optimization problem.
In the extreme case, we compute here the null space of the matrix.
And the solution is very simple if you know about SVD.
If you know Gaussian elimination, you have your nice triangulation and so on, but it will not work for practical examples.
It only works for academic examples; that means in the written exam you usually have three-by-three matrices or, if your professor is challenging you, a five-by-five matrix, but that's it.
If you know the SVD, how can we solve it?
Does anybody have an idea?
Of course.
I mean, if we have a non-trivial null space, that means the smallest singular value, or the smallest singular values, are very close to zero.
And we know which vectors out of the V matrix span the null space.
These are the column vectors of V that are associated with the singular values that are zero or close to zero.
That's all.
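A minimal sketch of this recipe in MATLAB (my own code, not dictated in the recording; A is assumed to be the matrix in question):

    [U, S, V] = svd(A);   % singular values on the diagonal of S, in descending order
    x = V(:, end);        % column of V for the smallest singular value; has unit L2 norm
    norm(A * x)           % the minimal length of A*x; (close to) zero if A has an (approximate) null space

If several singular values are zero or close to zero, the corresponding last columns of V span the (approximate) null space.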
So what do we need to know?
We do not need to know that much.
We have to know how to start MATLAB.
We have to know how to type in svd of the matrix, and then we have to know how to read off the result, that is, the column of V that belongs to the smallest singular value.
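One possible way to read off which singular values count as zero (again a sketch of my own, not shown in the recording) is to look at them directly:

    s = svd(A);           % only the singular values, sorted in descending order
    semilogy(s, 'o-')     % on a log scale, the 'zero' singular values drop far below the rest

For a noise-free matrix, MATLAB's null(A) returns an orthonormal basis of the exact null space right away.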